Second Order Equations: Wronskians and Independence

Edmund Chiang
MATH2351 / 2352 — Boyce & DiPrima §3.2–3
February 19, 2026

1   Existence and Uniqueness

1.1   The Main Theorem

Theorem — Existence & Uniqueness

The IVP

$$L(y) = y'' + p(t)y' + q(t)y = g(t), \qquad y(t_0) = y_0,\; y'(t_0) = y_0'$$

where $p(t)$, $q(t)$ and $g(t)$ are continuous on an open interval $I$ containing the point $t_0$, always admits a unique solution $y(t) = \phi(t)$ on the whole interval $I$.

For mathematically mature students, we refer to Chapter 6, §8 of E. A. Coddington, "An Introduction to Ordinary Differential Equations" (Englewood Cliffs, NJ: Prentice-Hall, 1961; New York: Dover, 1989) for a proof.


1.2   Examples

Example. Find the longest interval $I$ in which the IVP

$$(t^2 - 3t)y'' + ty' - (t + 3)y = 0, \quad y(1) = 2,\; y'(1) = 1$$

has a unique solution.

Let us rewrite the DE as

$$y'' + \frac{t}{t^2 - 3t}y' - \frac{t+3}{t^2 - 3t}y = 0.$$

That is, we have

$$p(t) = \frac{t}{t^2 - 3t} = \frac{1}{t - 3}, \quad q(t) = -\frac{t+3}{t^2 - 3t} = -\frac{t+3}{t(t - 3)}$$

where $p$ is continuous over $t < 3$ and over $t > 3$, whilst $q$ is continuous over $t < 0$, $0 < t < 3$ and $t > 3$. The initial point $t_0 = 1$ lies in the intersection of these regions of continuity, namely $I = (0, 3)$, on which the above theorem guarantees that the IVP has a unique solution (without our solving the DE explicitly).
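The bookkeeping above, namely locating the discontinuities of the coefficients and then taking the largest open interval around $t_0$ that avoids them, can be sketched in a few lines of Python (the helper name `continuity_interval` is ours, for illustration only):

```python
import math

def continuity_interval(t0, singular_points):
    """Largest open interval containing t0 that avoids every singular point."""
    left = max((s for s in singular_points if s < t0), default=-math.inf)
    right = min((s for s in singular_points if s > t0), default=math.inf)
    return left, right

# p and q are discontinuous only at t = 0 and t = 3; the IVP starts at t0 = 1.
print(continuity_interval(1.0, [0.0, 3.0]))  # (0.0, 3.0), i.e. I = (0, 3)
```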


Example. Consider the IVP

$$y'' + p(t)y' + q(t)y = g(t), \quad y(t_0) = 0,\; y'(t_0) = 0,$$

where $p(t)$, $q(t)$ are continuous on an open interval $I$ that contains the point $t_0$.

We observe that $y = \phi(t) \equiv 0$ (for all $t$) satisfies the DE and also $\phi(t_0) = 0 = \phi'(t_0)$. So the last Theorem asserts that $y = \phi(t) \equiv 0$ is the unique solution of this IVP.


Exercise (B&D)

Find the longest interval in which each of the following IVPs is guaranteed to have a unique solution. There is no need to solve these IVPs.

  1. $ty'' + 3y = t$, $y(1) = 1$, $y'(1) = 2$. (Ans. $0 < t < \infty$)
  2. $(t - 1)y'' - 3ty' + 4y = \sin t$, $y(-2) = 2$, $y'(-2) = 1$ (Ans. $-\infty < t < 1$)
  3. $t(t - 4)y'' + 3ty' + 4y = 2$, $y(3) = 0$, $y'(3) = -1$. (Ans. $0 < t < 4$)

2   Wronskians

2.1   Definition

Definition — Wronskian

Let $f(t)$ and $g(t)$ be two functions defined on a common domain $I$. We define a new function

$$W(f, g)(t) = f(t)g'(t) - f'(t)g(t)$$

called the Wronskian of $f(t)$ and $g(t)$, after Józef Maria Hoene-Wroński (1776–1853), a Polish Messianist philosopher, mathematician, physicist, inventor, lawyer, and economist.

We can rewrite

$$W(f, g)(t) = f(t)g'(t) - f'(t)g(t) = \begin{vmatrix} f(t) & g(t) \\ f'(t) & g'(t) \end{vmatrix}$$

So the Wronskian is also called the Wronskian determinant of $f(t)$ and $g(t)$.
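For readers who like to sanity-check such determinants numerically, here is a minimal Python sketch (the helper `wronskian` is our own naming) that approximates $W(f, g)(t)$ with central differences and tests it on $f = t^2$, $g = t^3$, whose Wronskian is $t^2 \cdot 3t^2 - 2t \cdot t^3 = t^4$:

```python
def wronskian(f, g, t, h=1e-5):
    """Approximate W(f, g)(t) = f g' - f' g via central differences."""
    fp = (f(t + h) - f(t - h)) / (2 * h)
    gp = (g(t + h) - g(t - h)) / (2 * h)
    return f(t) * gp - fp * g(t)

# W(t^2, t^3) = t^4, so at t = 2 the value should be close to 16.
print(wronskian(lambda t: t**2, lambda t: t**3, 2.0))  # ≈ 16
```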


2.2   Examples

Example. Recall that $y_1(t) = e^{-2t}$, $y_2(t) = e^{-3t}$ are two solutions of the DE

$$y'' + 5y' + 6y = 0.$$

Then

\begin{align} W(y_1, y_2)(t) &= \begin{vmatrix} e^{-2t} & e^{-3t} \\ -2e^{-2t} & -3e^{-3t} \end{vmatrix} \\[6pt] &= e^{-2t}(-3e^{-3t}) - e^{-3t}(-2e^{-2t}) \\[6pt] &= -3e^{-5t} + 2e^{-5t} \\[6pt] &= -e^{-5t}. \end{align}

Example. Let $y_1 = e^{r_1 t}$, $y_2 = e^{r_2 t}$. Then

$$W(y_1, y_2)(t) = \begin{vmatrix} e^{r_1 t} & e^{r_2 t} \\ r_1 e^{r_1 t} & r_2 e^{r_2 t} \end{vmatrix} = (r_2 - r_1)e^{(r_1 + r_2)t}.$$

Example. Let $y_1 = \cos t$, $y_2 = \sin t$. Then

$$W(y_1, y_2)(t) = \begin{vmatrix} \cos t & \sin t \\ -\sin t & \cos t \end{vmatrix} = \cos^2 t + \sin^2 t = 1,$$

for all $t$. So the Wronskian is a constant function.
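The two computations above can be spot-checked numerically in Python with exact derivatives; the parameter values below are arbitrary, and the determinant for the exponential pair evaluates to $(r_2 - r_1)e^{(r_1 + r_2)t}$:

```python
import math

# Exponential pair: the 2x2 determinant versus the closed form.
r1, r2, t = 1.5, -0.5, 0.7
W_exp = (math.exp(r1 * t) * r2 * math.exp(r2 * t)
         - r1 * math.exp(r1 * t) * math.exp(r2 * t))
print(W_exp, (r2 - r1) * math.exp((r1 + r2) * t))  # the two values agree

# Trigonometric pair: cos^2 t + sin^2 t = 1 at any t.
W_trig = math.cos(t) * math.cos(t) - (-math.sin(t)) * math.sin(t)
print(W_trig)  # ≈ 1
```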


Exercise — Compute Wronskians

Find the Wronskians of each of the following pairs and determine on which interval the two functions are linearly independent.

  1. $\{e^{-2t},\; e^{3t/2}\}$ (Ans. $\frac{7}{2}e^{-t/2}$, $\mathbb{R}$)
  2. $\{x,\; xe^x\}$ (Ans. $x^2 e^x$, $\mathbb{R}$ except $0$)
  3. $\{\cos^2\theta,\; 1 + \cos 2\theta\}$ (Ans. $0$, nowhere)
  4. $\{e^t\sin t,\; e^t\cos t\}$ (Ans. $-e^{2t}$, $\mathbb{R}$)

Exercise
  1. If the Wronskian $W$ of $f$ and $g$ is $3e^{4t}$, and if $f(t) = e^{2t}$, then find $g(t)$. (Ans. $3te^{2t} + ce^{2t}$)
  2. If $W(f, g)$ is the Wronskian of $f$ and $g$, and if $u = 2f - g$, $v = f + 2g$, find the Wronskian $W(u, v)$ of $u$ and $v$ in terms of $W(f, g)$. (Ans. $5W(f, g)$)
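Exercise 2 can be checked on a concrete pair. Taking $f = \cos t$ and $g = \sin t$ (so $W(f, g) = 1$), the Python sketch below confirms $W(2f - g, f + 2g) = 5\,W(f, g)$ at an arbitrarily chosen point:

```python
import math

# Evaluate f, g and their derivatives at one point, then form u, v.
t = 0.4
f, fp = math.cos(t), -math.sin(t)
g, gp = math.sin(t),  math.cos(t)
u, up = 2 * f - g, 2 * fp - gp   # u = 2f - g
v, vp = f + 2 * g, fp + 2 * gp   # v = f + 2g
print(u * vp - up * v, 5 * (f * gp - fp * g))  # both ≈ 5
```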

2.3   IVP Theorem

We deduce from the above theorem of existence and uniqueness the following.

Theorem — IVP

Consider the DE

$$L(y) = y'' + p(t)y' + q(t)y = 0, \tag{1}$$

where $p(t)$ and $q(t)$ are continuous on an open interval $I$ with initial condition

$$y(t_0) = y_0, \qquad y'(t_0) = y_0'.$$

Suppose $y_1$, $y_2$ are two solutions to the DE (1). If

$$W(y_1, y_2)(t_0) = y_1(t_0)y_2'(t_0) - y_1'(t_0)y_2(t_0) \neq 0$$

at the point $t_0$ in $I$, then there are constants $c_1$, $c_2$ such that the IVP is solved by

$$y(t) = c_1 y_1(t) + c_2 y_2(t).$$

So the main point in the above theorem is that one can find $c_1$, $c_2$ to solve the IVP provided that the Wronskian $W(y_1, y_2)(t_0) \neq 0$.

Proof. Let us substitute $t_0$ into $y(t)$: \begin{align} y_0 &= y(t_0) = c_1 y_1(t_0) + c_2 y_2(t_0) \\[6pt] y_0' &= y'(t_0) = c_1 y_1'(t_0) + c_2 y_2'(t_0). \end{align}

To solve these two equations for $c_1$ and $c_2$, multiply the first by $y_1'(t_0)$ and the second by $y_1(t_0)$:

\begin{align} y_1'(t_0)y_0 &= y_1'(t_0)y(t_0) = c_1 y_1'(t_0)y_1(t_0) + c_2 y_1'(t_0)y_2(t_0), \\[6pt] y_1(t_0)y_0' &= y_1(t_0)y'(t_0) = c_1 y_1(t_0)y_1'(t_0) + c_2 y_1(t_0)y_2'(t_0). \end{align}

Subtracting the two equations yields

\begin{align} y_1'(t_0)y_0 - y_1(t_0)y_0' &= c_2[y_1'(t_0)y_2(t_0) - y_1(t_0)y_2'(t_0)] \\[6pt] &= -c_2 W(y_1, y_2)(t_0). \end{align}

Since $W(y_1, y_2)(t_0) \neq 0$, we can solve for $c_2$:

$$c_2 = \frac{y_1(t_0)y_0' - y_1'(t_0)y_0}{W(y_1, y_2)(t_0)} = \frac{W(y_1, y)(t_0)}{W(y_1, y_2)(t_0)} = \frac{\begin{vmatrix} y_1(t_0) & y_0 \\ y_1'(t_0) & y_0' \end{vmatrix}}{\begin{vmatrix} y_1(t_0) & y_2(t_0) \\ y_1'(t_0) & y_2'(t_0) \end{vmatrix}}.$$

Similarly,

$$c_1 = \frac{y_2'(t_0)y_0 - y_2(t_0)y_0'}{W(y_1, y_2)(t_0)} = \frac{W(y, y_2)(t_0)}{W(y_1, y_2)(t_0)} = \frac{\begin{vmatrix} y_0 & y_2(t_0) \\ y_0' & y_2'(t_0) \end{vmatrix}}{\begin{vmatrix} y_1(t_0) & y_2(t_0) \\ y_1'(t_0) & y_2'(t_0) \end{vmatrix}}.$$
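The two Cramer-type formulas can be exercised on a concrete IVP. For $y'' + y = 0$ with $y_1 = \cos t$, $y_2 = \sin t$ and data $y(0) = 2$, $y'(0) = 3$, the Python sketch below (the function name `ivp_coefficients` is ours) recovers $c_1 = 2$, $c_2 = 3$, i.e. $y = 2\cos t + 3\sin t$:

```python
import math

def ivp_coefficients(y1, y1p, y2, y2p, t0, y0, y0p):
    """Cramer's rule for c1, c2 as in the proof above."""
    W = y1(t0) * y2p(t0) - y1p(t0) * y2(t0)
    c1 = (y0 * y2p(t0) - y0p * y2(t0)) / W
    c2 = (y1(t0) * y0p - y1p(t0) * y0) / W
    return c1, c2

c1, c2 = ivp_coefficients(math.cos, lambda t: -math.sin(t),
                          math.sin, math.cos, 0.0, 2.0, 3.0)
print(c1, c2)  # 2.0 3.0, i.e. y = 2 cos t + 3 sin t
```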

We may rephrase the above IVP theorem into the following statement.

Remark

Suppose $y_1$, $y_2$ are two solutions to the DE

$$L(y) = y'' + p(t)y' + q(t)y = 0,$$

where $p(t)$ and $q(t)$ are continuous on certain interval $I$. If

$$W(y_1, y_2)(t_0) = y_1(t_0)y_2'(t_0) - y_1'(t_0)y_2(t_0) \neq 0$$

for some $t_0$ in $I$, then every solution $\phi(t)$ of the DE can be written as

$$\phi(t) = c_1 y_1(t) + c_2 y_2(t)$$

for some constants $c_1$ and $c_2$.

Definition — General Solution & Fundamental Set

Let $y_1(t)$ and $y_2(t)$ be two solutions (functions) to the DE

$$L(y) = y'' + p(t)y' + q(t)y = 0,$$

where $p(t)$ and $q(t)$ are continuous on an open interval $I$. If $W(y_1, y_2)(t_0) \neq 0$ for some point $t_0$ in $I$, then we call

  1. $y(t) = c_1 y_1(t) + c_2 y_2(t)$ the general solution of the DE;
  2. $\{y_1, y_2\}$ a fundamental set of solutions of the DE.

3   Abel's Formula

Theorem — Abel

Let $y_1$, $y_2$ be solutions to the DE

$$L(y) = y'' + p(t)y' + q(t)y = 0,$$

where $p(t)$ and $q(t)$ are continuous on an open interval $I$. Then the Wronskian

$$W(y_1, y_2)(t) = c \cdot \exp\Big[-\int p(s)\,ds\Big]$$

for all $t$ in $I$, where $c$ is a constant that depends on $y_1$, $y_2$ but not on $t$. Moreover, either $W(y_1, y_2)(t) \neq 0$ for all $t$ in $I$ or $W(y_1, y_2)(t) \equiv 0$ for all $t$ in $I$.

Proof. We justify Abel's formula. Since $y_1$ and $y_2$ are solutions, we certainly have \begin{align} y_1'' + p(t)y_1' + q(t)y_1 &= 0 \\[6pt] y_2'' + p(t)y_2' + q(t)y_2 &= 0. \end{align}

Multiplying the first equation by $-y_2$, the second by $y_1$, and adding the results yields

$$y_1 y_2'' - y_2 y_1'' + p(t)(y_1 y_2' - y_2 y_1') = 0.$$

Let $W(t) = W(y_1, y_2)(t) = y_1 y_2' - y_1' y_2$. Notice that

$$W'(t) = y_1 y_2'' - y_1'' y_2.$$

So $W$ satisfies the first order DE:

$$W' + p(t)W(t) = 0.$$

This is a separable first order linear equation, whose general solution is

$$W(y_1, y_2)(t) = c \cdot \exp\Big[-\int p(s)\,ds\Big],$$

where the constant $c$ is independent of $t$. Since the exponential factor never vanishes, $W$ is either never zero on $I$ (when $c \neq 0$) or identically zero on $I$ (when $c = 0$). This completes the derivation.
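As a concrete check of the first order equation $W' + p(t)W = 0$, recall the earlier example $y'' + 5y' + 6y = 0$ (so $p = 5$) with $W(y_1, y_2)(t) = -e^{-5t}$; the Python sketch below differentiates $W$ by central differences and verifies that the residual is essentially zero:

```python
import math

def W(t):
    """Wronskian found earlier for y'' + 5y' + 6y = 0."""
    return -math.exp(-5 * t)

# Central-difference derivative of W at an arbitrary point.
t, h = 0.6, 1e-6
Wp = (W(t + h) - W(t - h)) / (2 * h)
print(Wp + 5 * W(t))  # ≈ 0, confirming W' + pW = 0 with p = 5
```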

Remarks
  1. $W(y_1, y_2)(t) \equiv 0$ occurs when $c = 0$;
  2. The above formula works not only for second order DE with constant coefficients, but for DE with arbitrary (well-behaved) coefficients $p(t)$, $q(t)$;
  3. This result was due to the brilliant but tragic Norwegian mathematician Niels Henrik Abel (1802-1829). He died before he reached the age of 27;
  4. Abel's result when combined with the last Theorem says that if there is a point $t_0$ in the open interval $I$ where $W(y_1, y_2)(t_0) \neq 0$, then $W(y_1, y_2)(t) \neq 0$ for all $t$ in $I$;
  5. We can conclude that $W(y_1, y_2)(t) \equiv 0$ occurs if and only if $y_1$ and $y_2$ are linearly dependent over $I$.

Exercise (B&D) — Abel's Formula Applications
  1. Find the Wronskian of the Bessel equation of order $\nu$ $$x^2 y'' + xy' + (x^2 - \nu^2)y = 0$$ without solving for its solutions. (Ans. $c/x$ $(c \neq 0)$)
  2. Same question for the Legendre's differential equation $$(1 - x^2)y'' - 2xy' + \alpha(\alpha + 1)y = 0 \quad (\alpha \text{ is a constant}).$$ (Ans. $c/(1 - x^2)$ $(c \neq 0)$)
  3. If the differential equation $ty'' + 2y' + te^t y = 0$ has $y_1$ and $y_2$ as a fundamental set of solutions and if $W(y_1, y_2)(1) = 2$, find the value of $W(y_1, y_2)(5)$. (Ans. $2/25$)
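Exercise 3 can also be confirmed by direct numerical integration, without Abel's formula. The Python sketch below (a plain RK4 integrator written for illustration) integrates two solutions of $ty'' + 2y' + te^t y = 0$ with $W(1) = 1$ from $t = 1$ to $t = 5$; Abel's formula predicts $W(t) = t^{-2}$, hence $W(5) = 0.04$:

```python
import math

def deriv(t, y, v):
    """Right-hand side of y'' = -(2/t) y' - e^t y as a first order system."""
    return v, -(2.0 / t) * v - math.exp(t) * y

def rk4(t0, y0, v0, t1, n):
    """Classical Runge-Kutta integration of (y, y') from t0 to t1 in n steps."""
    h = (t1 - t0) / n
    t, y, v = t0, y0, v0
    for _ in range(n):
        k1y, k1v = deriv(t, y, v)
        k2y, k2v = deriv(t + h/2, y + h/2 * k1y, v + h/2 * k1v)
        k3y, k3v = deriv(t + h/2, y + h/2 * k2y, v + h/2 * k2v)
        k4y, k4v = deriv(t + h, y + h * k3y, v + h * k3v)
        y += h/6 * (k1y + 2*k2y + 2*k3y + k4y)
        v += h/6 * (k1v + 2*k2v + 2*k3v + k4v)
        t += h
    return y, v

y1, y1p = rk4(1.0, 1.0, 0.0, 5.0, 40000)   # y1(1) = 1, y1'(1) = 0
y2, y2p = rk4(1.0, 0.0, 1.0, 5.0, 40000)   # y2(1) = 0, y2'(1) = 1
W5 = y1 * y2p - y1p * y2                   # W(1) = 1, so Abel gives W(5) = 1/25
print(W5)  # ≈ 0.04
```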

4   Linear Independence

4.1   Definition

Definition — Linear Independence

Let $y_1$ and $y_2$ be two functions defined on $I$. We say that $y_1$ and $y_2$ are

  1. linearly independent over $I$ if $$c_1 y_1(t) + c_2 y_2(t) = 0, \quad \text{all } t \text{ in } I,$$ implies $c_1 = c_2 = 0$;
  2. linearly dependent over $I$ if $$c_1 y_1(t) + c_2 y_2(t) = 0, \quad \text{all } t \text{ in } I,$$ holds for some constants $c_1$, $c_2$ that do not both vanish.

4.2   Main Theorem

Theorem — Wronskian and Linear Independence

Let $y_1$, $y_2$ be solutions of the DE (1) on $I$. Then $W(y_1, y_2)(t) \neq 0$ over $I$ if and only if $y_1$ and $y_2$ are linearly independent over $I$.

Proof. Suppose $W(y_1, y_2)(t) \neq 0$ over $I$ but $y_1$ and $y_2$ are linearly dependent over $I$. That is $$c_1 y_1(t) + c_2 y_2(t) = 0, \quad \text{all } t \text{ in } I,$$

and $c_1$, $c_2$ do not vanish simultaneously. We differentiate the equation once

$$c_1 y_1'(t) + c_2 y_2'(t) = 0, \quad \text{all } t \text{ in } I.$$

We use the first equation to eliminate $c_1$ from the second equation. This yields

$$0 = -c_2(y_2/y_1) \cdot y_1' + c_2 y_2' = c_2 \frac{y_1 y_2' - y_2 y_1'}{y_1}.$$

Since $y_1$ is not identically zero, clearing the denominator yields

$$0 = c_2(y_1 y_2' - y_2 y_1') = c_2 W(y_1, y_2)(t).$$

But $W(y_1, y_2)(t) \neq 0$ over $I$. Hence $c_2 = 0$. But then $c_1 = 0$. This contradicts the assumption that $c_1$, $c_2$ do not vanish simultaneously. Hence $y_1$, $y_2$ are linearly independent over $I$.

Conversely, suppose $y_1$, $y_2$ are linearly independent. We want to prove $W(y_1, y_2)(t) \neq 0$ over $I$. We prove the contrapositive instead: if there is a $t_0$ in $I$ such that $W(y_1, y_2)(t_0) = 0$, then $y_1$, $y_2$ are linearly dependent. Since the determinant $W(y_1, y_2)(t_0)$ vanishes, the linear system

$$c_1 y_1(t_0) + c_2 y_2(t_0) = 0,$$

and

$$c_1 y_1'(t_0) + c_2 y_2'(t_0) = 0,$$

admits a solution $(c_1, c_2)$ with $c_1$, $c_2$ not both zero. Then $\phi(t) = c_1 y_1(t) + c_2 y_2(t)$ is a solution of the DE satisfying $\phi(t_0) = \phi'(t_0) = 0$. By the uniqueness part of the existence and uniqueness theorem, $\phi(t) \equiv 0$ on $I$, that is,

$$c_1 y_1(t) + c_2 y_2(t) = 0, \quad \text{all } t \text{ in } I,$$

with $c_1$, $c_2$ not both zero. Hence $y_1$, $y_2$ are linearly dependent, as required.

This means that if $\{y_1, y_2\}$ is a fundamental set of solutions of the DE, then they must be linearly independent.


4.3   Examples

Example (revisited). Let $y_1 = e^{r_1 t}$, $y_2 = e^{r_2 t}$. Then

$$W(y_1, y_2)(t) = (r_2 - r_1)e^{(r_1 + r_2)t} \neq 0$$

everywhere, provided that $r_1 \neq r_2$. Hence $\{e^{r_1 t}, e^{r_2 t}\}$ is a fundamental set of solutions to the DE

$$(D - r_1)(D - r_2)y = 0,$$

where $D = \frac{d}{dt}$.


Example (revisited). Let $y_1 = \cos t$, $y_2 = \sin t$. Then

$$W(y_1, y_2)(t) = \cos^2 t + \sin^2 t = 1,$$

for all $t$. So $\{\cos t, \sin t\}$ is a fundamental set of solutions to the DE

$$(D + i)(D - i)y = (D^2 + 1)y = y'' + y = 0,$$

where $D = \frac{d}{dt}$.


Example (Linear dependence). Test whether $\{\sin t, \cos(t - \frac{\pi}{2})\}$ can be a fundamental set of solutions for some DE.

Since $\cos(t - \tfrac{\pi}{2}) = \sin t$, the two functions are identical, so

$$W\big(\sin t, \cos(t - \tfrac{\pi}{2})\big) = 0$$

for all $t$. Hence $\{\sin t, \cos(t - \frac{\pi}{2})\}$ is not a fundamental set for any DE.


Example. Show that

$$2t^2 y'' + 3ty' - y = 0, \quad t > 0,$$

admits a fundamental set of solutions $\{y_1 = t^{1/2},\; y_2 = t^{-1}\}$.

We leave it to the reader to verify that $y_1$, $y_2$ are both solutions to the DE. It remains to verify

$$W(y_1, y_2)(t) = \begin{vmatrix} t^{1/2} & t^{-1} \\ \frac{1}{2}t^{-1/2} & -t^{-2} \end{vmatrix} = -t^{-3/2} - \frac{1}{2}t^{-3/2} = -\frac{3}{2}t^{-3/2} \neq 0$$

provided $t > 0$. So $\{y_1, y_2\}$ is a fundamental set of solutions.
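The verification left to the reader can be mechanized. Using exact power-rule derivatives, the Python sketch below checks at the sample point $t = 2$ that both functions solve the DE and that the Wronskian agrees with $-\frac{3}{2}t^{-3/2}$:

```python
# y1 = t^{1/2} and y2 = t^{-1} with their first and second derivatives.
t = 2.0
y1, y1p, y1pp = t**0.5, 0.5 * t**-0.5, -0.25 * t**-1.5
y2, y2p, y2pp = t**-1, -t**-2, 2 * t**-3

print(2*t*t*y1pp + 3*t*y1p - y1)   # ≈ 0: y1 solves 2t^2 y'' + 3t y' - y = 0
print(2*t*t*y2pp + 3*t*y2p - y2)   # ≈ 0: y2 solves it too
print(y1 * y2p - y1p * y2, -1.5 * t**-1.5)  # Wronskian matches -(3/2) t^{-3/2}
```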


Example. We recall that the DE

$$ay'' + by' + cy = 0,$$

admits two characteristic roots

$$r_{1,2} = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a} = \lambda \pm i\mu$$

and when $b^2 - 4ac < 0$, it admits the two real solutions $e^{\lambda t}\cos\mu t$ and $e^{\lambda t}\sin\mu t$. We compute their Wronskian:

\begin{align} W(e^{\lambda t}\cos\mu t,\; e^{\lambda t}\sin\mu t) &= \begin{vmatrix} e^{\lambda t}\cos\mu t & e^{\lambda t}\sin\mu t \\ (e^{\lambda t}\cos\mu t)' & (e^{\lambda t}\sin\mu t)' \end{vmatrix} \\[6pt] &= \begin{vmatrix} e^{\lambda t}\cos\mu t & e^{\lambda t}\sin\mu t \\ e^{\lambda t}(\lambda\cos\mu t - \mu\sin\mu t) & e^{\lambda t}(\lambda\sin\mu t + \mu\cos\mu t) \end{vmatrix} \\[6pt] &= \mu e^{2\lambda t} \end{align}

which is non-vanishing throughout the real axis since $\mu \neq 0$. Hence $\{e^{\lambda t}\cos\mu t,\; e^{\lambda t}\sin\mu t\}$ forms a fundamental set of solutions to the DE even though the characteristic roots $r_1$, $r_2$ are complex (conjugates).
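A one-point numerical check of the identity $W(e^{\lambda t}\cos\mu t,\; e^{\lambda t}\sin\mu t) = \mu e^{2\lambda t}$, with arbitrary sample values of $\lambda$, $\mu$ and $t$:

```python
import math

# Entries of the 2x2 Wronskian determinant, using the product-rule
# derivatives computed in the example above.
lam, mu, t = -0.3, 2.0, 1.1
y1  = math.exp(lam * t) * math.cos(mu * t)
y1p = math.exp(lam * t) * (lam * math.cos(mu * t) - mu * math.sin(mu * t))
y2  = math.exp(lam * t) * math.sin(mu * t)
y2p = math.exp(lam * t) * (lam * math.sin(mu * t) + mu * math.cos(mu * t))
print(y1 * y2p - y1p * y2, mu * math.exp(2 * lam * t))  # the two agree
```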

5   Fundamental Sets of Solutions

The following theorem guarantees that a given second order linear differential equation must have a fundamental set of solutions.

Theorem — Existence of Fundamental Set

The DE

$$L(y) = y'' + p(t)y' + q(t)y = 0,$$

where $p(t)$ and $q(t)$ are continuous on an open interval $I$, always admits a fundamental set of solutions $\{y_1, y_2\}$. Moreover, for an arbitrarily chosen $t_0$ in $I$, one may take $y_1$, $y_2$ to satisfy the initial conditions

\begin{align} y_1(t_0) = 1, \quad &y_1'(t_0) = 0, \\[6pt] y_2(t_0) = 0, \quad &y_2'(t_0) = 1. \end{align}
Proof. The existence and uniqueness theorem above guarantees that each of the two initial value problems \begin{align} L(y) = y'' + p(t)y' + q(t)y &= 0, \quad y(t_0) = 1,\; y'(t_0) = 0, \\[6pt] L(y) = y'' + p(t)y' + q(t)y &= 0, \quad y(t_0) = 0,\; y'(t_0) = 1, \end{align}

admits a unique solution. Let these solutions be $y_1$ and $y_2$ respectively. But then

$$W(y_1, y_2)(t_0) = \begin{vmatrix} y_1(t_0) & y_2(t_0) \\ y_1'(t_0) & y_2'(t_0) \end{vmatrix} = \begin{vmatrix} 1 & 0 \\ 0 & 1 \end{vmatrix} = 1 \neq 0.$$

Example (B&D). Find the fundamental set of solutions $y_1$, $y_2$ of the DE

$$y'' - y = 0,$$

that is described by the existence of fundamental set theorem.

It is a standard procedure to check that $y_1 = e^t$ and $y_2 = e^{-t}$ form a fundamental set (of solutions) to the DE: $W(e^t, e^{-t})(t) = -2 \neq 0$. The remark after the IVP theorem, applied with the initial point $t_0 = 0$, guarantees that there are constants $c_1$, $c_2$ and $d_1$, $d_2$ so that the expected fundamental set satisfying

\begin{align} y_3(t_0) = 1, \quad &y_3'(t_0) = 0, \\[6pt] y_4(t_0) = 0, \quad &y_4'(t_0) = 1 \end{align}

can be written in the form

\begin{align} y_3(t) &= c_1 e^t + c_2 e^{-t}, \\[6pt] y_4(t) &= d_1 e^t + d_2 e^{-t}. \end{align}

Indeed, it is a simple exercise to verify that $c_1 = \frac{1}{2}$, $c_2 = \frac{1}{2}$ and $d_1 = \frac{1}{2}$, $d_2 = -\frac{1}{2}$; that is, $y_3(t) = \cosh t$ and $y_4(t) = \sinh t$, with

$$W(y_3, y_4)(t) = W(\cosh t, \sinh t) = \begin{vmatrix} \cosh t & \sinh t \\ \sinh t & \cosh t \end{vmatrix} = \cosh^2 t - \sinh^2 t = 1.$$

So $\{\cosh t, \sinh t\}$ forms a fundamental set of solutions of the kind the question asked for.
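A short Python check of the normalized initial data and the constant Wronskian of $\{\cosh t, \sinh t\}$:

```python
import math

# cosh(0) = 1 and sinh(0) = 0, matching the prescribed initial data at t0 = 0.
print(math.cosh(0.0), math.sinh(0.0))

# W(cosh, sinh)(t) = cosh^2 t - sinh^2 t = 1 at any t (here t = 0.8).
t = 0.8
W = math.cosh(t) ** 2 - math.sinh(t) ** 2
print(W)  # ≈ 1
```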


Example (Revisited). Recall that $y_1(t) = e^{-2t}$, $y_2(t) = e^{-3t}$ are two solutions of the DE

$$y'' + 5y' + 6y = 0$$

with

$$W(y_1, y_2)(t) = -e^{-5t} \neq 0.$$

Thus $y_1(t) = e^{-2t}$, $y_2(t) = e^{-3t}$ form a fundamental set of solutions to the differential equation. So every solution $y$ of the DE on the real axis can be written in the form $y(t) = c_1 e^{-2t} + c_2 e^{-3t}$ for some constants $c_1$, $c_2$.

6   Exercises

Exercise (B&D)
  1. Find the fundamental set of solutions $y_1$, $y_2$ of $$y'' + y' - 2y = 0, \quad t_0 = 0$$ such that $y_1$, $y_2$ meet the initial conditions specified in the existence of fundamental set theorem. (Ans. $y_1 = \frac{1}{3}e^{-2t} + \frac{2}{3}e^{t}$, $y_2 = -\frac{1}{3}e^{-2t} + \frac{1}{3}e^{t}$)
  2. Consider the equation $y'' - y' - 2y = 0$.
    • Show that $y_1(t) = e^{-t}$ and $y_2(t) = e^{2t}$ form a fundamental set of solutions. (Ans. Yes)
    • Let $y_3(t) = -2e^{2t}$, $y_4(t) = y_1(t) + 2y_2(t)$, and $y_5(t) = 2y_1(t) - 2y_3(t)$. Are $y_3(t)$, $y_4(t)$, and $y_5(t)$ also solutions of the given differential equation? (Ans. Yes)
    • Determine whether each of the following pairs forms a fundamental set of solutions: $[y_1(t), y_3(t)]$; $[y_2(t), y_3(t)]$; $[y_1(t), y_4(t)]$; $[y_4(t), y_5(t)]$. (Ans. $[y_1(t), y_3(t)]$ and $[y_1(t), y_4(t)]$ are fundamental sets of solutions)
Exercise — Fundamental Sets

Let $\{y_1, y_2\}$ be a fundamental set of a second order ODE. Let $y_3 = a_1 y_1 + a_2 y_2$ and $y_4 = b_1 y_1 + b_2 y_2$. Show

$$W(y_3, y_4) = (a_1 b_2 - a_2 b_1)\,W(y_1, y_2).$$

Discuss under what condition $\{y_3, y_4\}$ can be a fundamental set of the same second order ODE.

7   Practice MCQ

Practice 1: Compute a Wronskian

Let $y_1 = e^{2t}$ and $y_2 = e^{-3t}$. Compute the Wronskian $W(y_1, y_2)(t)$.

PHASE 0 Recall the Formula

The Wronskian $W(f, g)$ is defined as $W(f, g)(t) = f(t)g'(t) - f'(t)g(t)$.

Now let's compute step by step...
PHASE A Find the Derivatives

What is $y_1' = \frac{d}{dt}(e^{2t})$?

What is $y_2' = \frac{d}{dt}(e^{-3t})$?

Now compute the determinant...
PHASE B Compute the Wronskian
Step B1

$W = y_1 y_2' - y_1' y_2 = e^{2t}(-3e^{-3t}) - (2e^{2t})(e^{-3t}) = ?$

Step B2

Since $W(y_1, y_2) = -5e^{-t} \neq 0$ for all $t$, what can we conclude?

Excellent! $W(e^{2t}, e^{-3t}) = -5e^{-t}$. Since $W \neq 0$, they form a fundamental set of solutions.
Practice 2: Apply Abel's Formula

Consider the DE $y'' + 4y' + 3y = 0$. If $y_1, y_2$ are solutions with $W(y_1, y_2)(0) = 2$, find $W(y_1, y_2)(t)$ using Abel's formula.

PHASE 0 Recall Abel's Formula

For $y'' + p(t)y' + q(t)y = 0$, Abel's formula states that $W(y_1, y_2)(t) = c \cdot \exp\big[-\int p(t)\,dt\big]$.

Now identify $p(t)$ and apply the formula...
PHASE A Identify $p(t)$

In the DE $y'' + 4y' + 3y = 0$, what is $p(t)$?

What is $\displaystyle\int p(t)\,dt = \int 4\,dt$?

Now apply Abel's formula...
PHASE B Find the Wronskian
Step B1

By Abel's formula, $W(t) = c \cdot e^{-4t}$. Using $W(0) = 2$, find $c$:

Step B2

Therefore, $W(y_1, y_2)(t) = ?$

Excellent! By Abel's formula, $W(y_1, y_2)(t) = 2e^{-4t}$. Note that $W \neq 0$ for all $t$, so $y_1, y_2$ are linearly independent everywhere.
Practice 3: Check Linear Independence

Determine whether $y_1 = \sin t$ and $y_2 = \cos(t - \frac{\pi}{2})$ form a fundamental set of solutions.

PHASE 0 Simplify $y_2$

Using the identity $\cos(\theta - \frac{\pi}{2}) = \sin\theta$, what is $y_2$?

So $y_1$ and $y_2$ are the same function! What does this mean?
PHASE A Compute the Wronskian

If $y_1 = \sin t$ and $y_2 = \sin t$, what is $W(y_1, y_2)$?

Since $W(y_1, y_2) \equiv 0$, what can we conclude?

Final conclusion...
PHASE B Conclusion
Final Answer

Do $\{\sin t, \cos(t - \frac{\pi}{2})\}$ form a fundamental set?

Correct! $\{\sin t, \cos(t - \frac{\pi}{2})\}$ is NOT a fundamental set because $\cos(t - \frac{\pi}{2}) = \sin t$, making them linearly dependent. A fundamental set for $y'' + y = 0$ would be $\{\sin t, \cos t\}$.

— End of Wronskians and Independence Notes —